Pseudo-random graphs and bit probe schemes with one-sided error
We study probabilistic bit-probe schemes for the membership problem. Given a
set A of at most n elements from the universe of size m we organize such a
structure that queries of type "Is x in A?" can be answered very quickly.
H. Buhrman, P. B. Miltersen, J. Radhakrishnan, and S. Venkatesh proposed a bit-probe
scheme based on expanders. Their scheme needs space of O(n\log m) bits, and
requires reading only one randomly chosen bit from the memory to answer a
query. The answer is correct with high probability, with two-sided error. In
this paper we show that for the same problem there exists a bit-probe scheme
with one-sided error that needs space of O(n\log^2 m+\poly(\log m)) bits. The
difference with the model of Buhrman, Miltersen, Radhakrishnan, and Venkatesh
is that we consider a bit-probe scheme with an auxiliary word. This means that
in our scheme the memory is split into two parts of different size: the main
storage of O(n\log^2 m) bits and a short word of \poly(\log m) bits that is
pre-computed once for the stored set A and `cached'. To answer a query "Is x in
A?" the scheme is allowed to read the whole cached word and only one bit from
the main storage. For some reasonable values of parameters our space bound is
better than what can be achieved by any scheme without cached data.
Comment: 19 pages
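The one-probe model described above can be illustrated with a much simpler toy scheme: a Bloom-filter-style bitmap where every stored element sets several pseudo-random positions and a query reads a single randomly chosen one of them. This is only a sketch of the query model with one-sided error, not the paper's expander-based construction; all parameters and names below are illustrative.

```python
import random
import hashlib

def positions(x, k, m_bits, salt=""):
    """k pseudo-random bit positions in [0, m_bits) for key x
    (a toy stand-in for the expander neighbourhoods of the actual scheme)."""
    return [int(hashlib.sha256(f"{salt}:{i}:{x}".encode()).hexdigest(), 16)
            % m_bits for i in range(k)]

class OneProbeMembership:
    """Toy one-sided-error membership structure.

    Building: set every position of every stored element in a bitmap.
    Query: read ONE randomly chosen position of x.
    Stored elements are never rejected (no false negatives);
    non-members may be wrongly accepted with small probability."""

    def __init__(self, A, m_bits=1 << 16, k=8):
        self.m_bits, self.k = m_bits, k
        self.bits = bytearray(m_bits)        # the "main storage"
        for a in A:
            for p in positions(a, k, m_bits):
                self.bits[p] = 1

    def query(self, x):
        p = random.choice(positions(x, self.k, self.m_bits))  # one bit probe
        return self.bits[p] == 1
```

Note the one-sided guarantee: for x in A every probed position is set, so the answer "yes" is always returned; errors occur only for non-members, and their probability shrinks as the bitmap grows relative to k·|A|.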
On the Combinatorial Version of the Slepian-Wolf Problem
We study the following combinatorial version of the Slepian-Wolf coding
scheme. Two isolated Senders are given binary strings x and y respectively;
the length of each string is equal to n, and the Hamming distance between the
strings is at most \alpha n. The Senders compress their strings and
communicate the results to the Receiver. Then the Receiver must reconstruct
both strings x and y. The aim is to minimise the lengths of the transmitted
messages.
For an asymmetric variant of this problem (where one of the Senders transmits
the input string to the Receiver without compression) with deterministic
encoding a nontrivial lower bound was found by A. Orlitsky and K. Viswanathan.
In our paper we prove a new lower bound for the schemes with syndrome coding,
where at least one of the Senders uses linear encoding of the input string.
For the combinatorial Slepian-Wolf problem with randomized encoding the
theoretical optimum of communication complexity was recently found by the first
author, though effective protocols with optimal lengths of messages remained
unknown. We close this gap and present a polynomial time randomized protocol
that achieves the optimal communication complexity.
Comment: 20 pages, 14 figures. Accepted to IEEE Transactions on Information Theory (June 2018).
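Syndrome coding, mentioned above, can be shown on a toy instance of the asymmetric variant: one Sender transmits x in full, the other transmits only the syndrome of y under a linear code correcting the allowed number of mismatches; the Receiver decodes the difference x XOR y from the syndromes. A minimal sketch with the [7,4] Hamming code (one allowed mismatched bit; purely illustrative, not the paper's asymptotic construction):

```python
import numpy as np

# Parity-check matrix of the [7,4] Hamming code: column j is the binary
# representation of j (least significant bit first), so the code corrects
# any single bit flip.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=int)

def syndrome(v):
    """Linear encoding used by the second Sender: 3 bits instead of 7."""
    return H @ v % 2

def decode_error(s):
    """The syndrome of a weight-<=1 error vector reads as the position
    (1..7) of the flipped bit; syndrome 0 means no flip."""
    pos = s[0] + 2 * s[1] + 4 * s[2]
    e = np.zeros(7, dtype=int)
    if pos:
        e[pos - 1] = 1
    return e

def receiver(x, syn_y):
    """Receiver got x in full and only the syndrome of y.
    By linearity, syndrome(x) + syn_y = syndrome(x XOR y), which decodes
    to the error vector e; then y = x XOR e."""
    e = decode_error((syndrome(x) + syn_y) % 2)
    return (x + e) % 2
```

Here 7 + 3 = 10 bits are transmitted instead of the naive 14, and the same linearity trick is what "syndrome coding" refers to in the lower bound above.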
Communication Complexity of the Secret Key Agreement in Algorithmic Information Theory
It is known that the mutual information, in the sense of Kolmogorov
complexity, of any pair of strings x and y is equal to the length of the
longest shared secret key that two parties can establish via a probabilistic
protocol with interaction on a public channel, assuming that the parties hold
as their inputs x and y respectively. We determine the worst-case communication
complexity of this problem for the setting where the parties can use private
sources of random bits. We show that for some x, y the communication complexity
of the secret key agreement does not decrease even if the parties have to agree
on a secret key whose size is much smaller than the mutual information between
x and y. On the other hand, we discuss examples of x, y such that the
communication complexity of the protocol declines gradually with the size of
the derived secret key. The proof of the main result uses spectral properties
of appropriate graphs and the expander mixing lemma, as well as information
theoretic techniques.
Comment: 33 pages, 6 figures. v3: the full version of the MFCS 2020 paper.
Topological arguments for Kolmogorov complexity
We present several applications of simple topological arguments in problems of
Kolmogorov complexity. Basically we use the standard fact from topology that
the disk is simply connected. This turns out to be enough to construct strings
with some nontrivial algorithmic properties.
Comment: Extended version.
1D Effectively Closed Subshifts and 2D Tilings
Michael Hochman showed that every 1D effectively closed subshift can be
simulated by a 3D subshift of finite type and asked whether the same can be
done in 2D. It turned out that the answer is positive and that the necessary
tools had already been developed in the theory of tilings. We discuss two
alternative approaches: the first, developed by N. Aubrun and M. Sablik, goes
back to Leonid Levin; the second one, developed by the authors, goes back to
Peter Gacs.
Comment: Journées Automates Cellulaires, Turku, Finland (2010).